1.
Fast image codecs are a pressing need in applications that deal with large numbers of images. Graphics Processing Units (GPUs) are suitable processors for speeding up most kinds of algorithms, especially those that permit fine-grain parallelism. Bitplane Coding with Parallel Coefficient processing (BPC-PaCo) is a recently proposed algorithm for the core stage of wavelet-based image codecs, tailored to the highly parallel architecture of GPUs. The algorithm provides complexity scalability, allowing faster execution at the expense of coding efficiency. Its main drawback is that the speedup and the loss in image quality are controlled only coarsely, resulting in visible distortion at low and medium rates. This paper addresses this issue by integrating visually lossless coding techniques into BPC-PaCo. The resulting method minimizes the visual distortion introduced in the compressed file, yielding images of higher quality to a human observer. Experimental results also indicate 12% speedups with respect to BPC-PaCo.
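The trade-off described above — coding bitplanes most-significant first so that dropping the least significant planes trades quality for rate and speed — can be sketched as follows. This is a generic bitplane split, not the actual BPC-PaCo coder; the function names and the 8-bit magnitude assumption are illustrative.

```python
import numpy as np

def bitplanes(coeffs, num_planes=8):
    """Split quantized wavelet coefficients into sign + bitplanes, MSB first."""
    signs = np.signbit(coeffs)
    mags = np.abs(coeffs).astype(np.uint32)
    planes = [((mags >> p) & 1).astype(np.uint8)
              for p in range(num_planes - 1, -1, -1)]
    return signs, planes

def reconstruct(signs, planes, num_planes=8, kept=None):
    """Rebuild magnitudes from the `kept` most significant planes only."""
    kept = len(planes) if kept is None else kept
    mags = np.zeros(planes[0].shape, np.uint32)
    for i in range(kept):
        mags |= planes[i].astype(np.uint32) << (num_planes - 1 - i)
    vals = mags.astype(np.int32)
    return np.where(signs, -vals, vals)

coeffs = np.array([37, -5, 120, 0, -64])
s, ps = bitplanes(coeffs)
full = reconstruct(s, ps)            # all 8 planes -> lossless
coarse = reconstruct(s, ps, kept=3)  # top 3 planes -> coarse approximation
```

Keeping fewer planes is the complexity/rate knob: `coarse` quantizes each magnitude to its top three bits.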
2.
For the multi-view subspace clustering problem, a multi-view subspace clustering algorithm based on latent low-rank sparse representation (LLSMSC) is proposed. The algorithm constructs a latent structure shared across the multiple views to exploit the complementary information among them. By imposing low-rank and sparsity constraints on the latent subspace representation, it captures both the local structure and the sparse structure of the data, yielding more accurate clustering results. The optimization problem is solved efficiently with an alternating-direction minimization algorithm based on augmented Lagrange multipliers. Experiments on six different datasets verify the effectiveness and superiority of LLSMSC.
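Augmented-Lagrangian solvers of this kind typically enforce the low-rank and sparsity constraints through two proximal operators: singular value thresholding for the nuclear norm and soft thresholding for the l1 norm. A minimal sketch of both building blocks (the variable names `Z`, `L`, `S` are hypothetical, not taken from the paper):

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0)) @ Vt

def soft(X, tau):
    """Soft thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0)

# One ADMM-style sweep: split a representation Z into a low-rank part L
# and a sparse part S (illustrative data, not the paper's updates).
rng = np.random.default_rng(0)
Z = rng.standard_normal((20, 20))
L = svt(Z, tau=1.0)
S = soft(Z - L, tau=0.5)
```

A full solver would alternate these updates with a Lagrange-multiplier step until convergence.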
3.
Mixed-type data are usually clustered by evaluating attributes of different types separately. Partitioning the attributes into different subspaces and measuring them separately, however, breaks the original unity of the attributes and introduces inconsistent biases into the similarity evaluation between samples. To address this problem, a new clustering algorithm is proposed that encodes sample attributes in binary and measures the encoded attributes uniformly with the Hamming difference. By measuring similarity within a unified framework, the new algorithm avoids splitting the attributes; on this basis, it assigns each attribute a weight according to its nature and uses the weighted measure to evaluate the similarity between samples. Experimental results show that the new algorithm clusters mixed-type data effectively and achieves better accuracy and stability than existing clustering algorithms.
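The unified-measurement idea can be illustrated with a small sketch: each attribute, categorical or discretized numeric, is mapped to a one-hot binary code, and similarity is an attribute-weighted Hamming distance over the concatenated codes. The encoding and weighting choices below are illustrative, not the paper's exact scheme.

```python
import numpy as np

def encode(value, categories):
    """One-hot binary code for a categorical attribute."""
    code = np.zeros(len(categories), dtype=np.uint8)
    code[categories.index(value)] = 1
    return code

def encode_numeric(x, bins):
    """Discretize a numeric attribute into bins, then one-hot the bin index."""
    idx = min(np.digitize(x, bins), len(bins))
    code = np.zeros(len(bins) + 1, dtype=np.uint8)
    code[idx] = 1
    return code

def weighted_hamming(a, b, weights):
    """Attribute-weighted Hamming distance over per-attribute binary codes."""
    return sum(w * np.count_nonzero(ca != cb)
               for ca, cb, w in zip(a, b, weights))

# Two samples with one categorical and one numeric attribute
colors = ["red", "green", "blue"]
bins = [10.0, 20.0]  # numeric attribute split into 3 bins
s1 = [encode("red", colors), encode_numeric(5.0, bins)]
s2 = [encode("blue", colors), encode_numeric(25.0, bins)]
d = weighted_hamming(s1, s2, weights=[1.0, 1.0])  # both attributes differ
```

Both attribute types now live in one binary space, so no separate per-type metric is needed.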
4.
Large-scale effects are pervasive and hard to avoid in both the operation and the study of computer systems; handling them effectively requires analyzing the file system with distributed processing. This article analyzes the data model of distributed multidimensional online analytical processing (MOLAP) and discusses in depth the dimension-encoding algorithm, the MapReduce implementation, and the traversal algorithm for analysis dimensions.
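One common form of dimension encoding in MOLAP systems packs each dimension's member code into a fixed bit-field of a single integer key, which makes cells cheap to shuffle and sort in MapReduce. This is a generic sketch under assumed bit widths, not the article's specific algorithm; the dimension names are illustrative.

```python
# Pack each dimension's member code into a fixed bit-field of one integer key
# (bit widths and dimension names are illustrative).
DIM_BITS = {"time": 10, "region": 8, "product": 14}

def encode_cell(codes):
    """Concatenate member codes into a single sortable integer key."""
    key = 0
    for name, bits in DIM_BITS.items():
        key = (key << bits) | codes[name]
    return key

def decode_cell(key):
    """Recover the per-dimension member codes from a packed key."""
    codes = {}
    for name, bits in reversed(DIM_BITS.items()):
        codes[name] = key & ((1 << bits) - 1)
        key >>= bits
    return codes

key = encode_cell({"time": 731, "region": 42, "product": 9001})
```

Because the key preserves dimension order, sorting keys in the reduce phase groups cells by the leading dimensions for free.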
5.
In this letter, we address the problem of Direction of Arrival (DOA) estimation with a nonuniform linear array in the sparse Bayesian learning (SBL) framework. The nonuniform array output is treated as an incomplete-data observation, and a hypothetical uniform linear array output as an unavailable complete-data observation. The Expectation-Maximization (EM) criterion is then used to iteratively maximize the expected value of the complete-data log-likelihood under the posterior distribution of the latent variable. The novelty of the proposed method lies in its capability of interpolating the actual received data to a virtual uniform linear array, thereby extending the achievable array aperture. Simulation results demonstrate the superiority of the proposed method over off-the-shelf algorithms, especially under low SNR, insufficient snapshots, and spatially adjacent sources.
6.
According to the circle-packing theorem, the packing efficiency of a hexagonal lattice is higher than that of an equivalent square tessellation. Consequently, in several contexts, hexagonally sampled images preserve information content better than their Cartesian counterparts. In this paper, novel mapping techniques alongside a wavelet compression scheme are presented for hexagonal images. Specifically, we introduce two tree-based coding schemes, referred to as SBHex (spirally-mapped branch-coding for hexagonal images) and BBHex (breadth-first block-coding for hexagonal images). Both coding schemes respect the geometry of the hexagonal lattice and yield better compression results. Our empirical results show that the proposed algorithms for hexagonal images produce better reconstruction quality at low bits-per-pixel representations than the tree-based coding counterparts for the Cartesian grid.
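The packing-efficiency claim is easy to verify numerically: circles on a hexagonal lattice cover π/(2√3) ≈ 90.7% of the plane, versus π/4 ≈ 78.5% for a square lattice — roughly 15.5% denser.

```python
import math

# Fraction of the plane covered by unit circles on each lattice
hex_density = math.pi / (2 * math.sqrt(3))  # hexagonal packing, ~0.9069
square_density = math.pi / 4                # square packing,    ~0.7854
gain = hex_density / square_density         # ~1.1547, i.e. ~15.5% denser
```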
7.
In this article, we introduce a new bi-directional dual-relay selection strategy together with an analysis of its bit error rate (BER) performance. In the first step of the proposed strategy, two relays are selected out of a set of N relay nodes so as to optimize the system's BER, using a proposed algorithm that checks whether the relays selected by the max-min criterion are indeed the best ones. In the second step, the chosen relay nodes perform an orthogonal space-time coding scheme using the two-phase relaying protocol to establish bi-directional communication between the terminals, leading to a significant improvement in the achievable coding and diversity gain. To further improve overall system performance, the selected relay nodes also apply a digital network coding scheme. Furthermore, this paper derives an analytical approximation of the BER performance of the proposed strategy and shows that the analytical results almost perfectly match the simulated ones. Finally, our simulation results show that the proposed strategy outperforms current state-of-the-art ones.
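The max-min selection step can be sketched as an exhaustive search over relay pairs: each candidate pair is scored by the weakest of its four hop gains, and the pair with the largest worst-case gain wins. The gain values and function names below are illustrative, not the paper's exact verification algorithm.

```python
import itertools

def max_min_pair(gains_a, gains_b):
    """Pick the two relays whose worst hop gain is largest (max-min).
    gains_a[i], gains_b[i]: channel gains of relay i toward each terminal."""
    n = len(gains_a)
    best_pair, best_score = None, -1.0
    for i, j in itertools.combinations(range(n), 2):
        score = min(gains_a[i], gains_b[i], gains_a[j], gains_b[j])
        if score > best_score:
            best_pair, best_score = (i, j), score
    return best_pair, best_score

# Four candidate relays with illustrative per-hop gains
pair, score = max_min_pair([0.9, 0.2, 0.8, 0.5], [0.7, 0.9, 0.6, 0.4])
```

With N relays this brute force costs O(N^2) pair evaluations, which is cheap for typical relay counts.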
8.
Electroencephalogram (EEG) signal processing has emerged as a critical problem for biometric applications due to its real-time requirements. While compressive sensing is an efficient method for signal compression, its application to EEG signal processing is limited by its unawareness of noise during transmission and by its time-consuming reconstruction procedure. In this paper, we propose a noise-aware sparse Bayesian learning approach with block structure (NA-BSBL) to achieve higher efficiency in data compression, reconstruction, and classification. By applying a novel parameter structure and introducing the Mahalanobis distance, our approach achieves an almost 20% lift in reconstruction performance and a 10% lift in accuracy under noisy conditions. For further use of the reconstructed EEG signal, we extract both spatial- and frequency-domain features for classification. Experimental results show that the proposed approach achieves 94% classification accuracy with a 16% speedup compared with the conventional approach.
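The Mahalanobis distance mentioned above weights each deviation by the inverse covariance of the data, so spread-out directions count for less than narrow ones. A minimal sketch (the mean and covariance values are illustrative, not EEG statistics):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of sample x from a class with given mean/covariance."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

mean = np.array([0.0, 0.0])
cov = np.array([[4.0, 0.0],   # wide spread along the first axis
                [0.0, 1.0]])  # narrow spread along the second
d1 = mahalanobis(np.array([2.0, 0.0]), mean, cov)  # one "sigma" along axis 0
d2 = mahalanobis(np.array([0.0, 2.0]), mean, cov)  # two "sigmas" along axis 1
```

Both points are at Euclidean distance 2, but `d2` is twice `d1` because the second axis has a quarter of the variance.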
9.
As an extension of the High Efficiency Video Coding (HEVC) standard, 3D-HEVC requires encoding multiple texture views and depth maps, and inherits the same quad-tree coding structure as HEVC. Owing to the distinct properties of texture views and depth maps, existing fast intra-prediction approaches have been designed for each separately. To further reduce the coding complexity of 3D-HEVC, a self-learning residual-model-based fast coding unit (CU) size decision approach is proposed for the intra coding of both texture views and depth maps. The residual signal, defined as the difference between the original luminance pixel and the optimal predicted luminance pixel, is first extracted from each CU. Since the residual signal is strongly correlated with the optimal CU partition, it is used as the feature of each CU. A self-learning residual model is then established by intra feature learning, which iteratively learns the features of the previously encoded coding tree units (CTUs) that it generates. Finally, a binary classifier built on the self-learning residual model terminates the CU size decision early for both texture views and depth maps. Experimental results show the proposed fast intra CU size decision approach achieves 33.3% and 49.3% encoding-time reductions on average for texture views and depth maps, respectively, with negligible loss of overall video quality.
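The early-termination idea can be sketched with a toy binary classifier: extract the residual energy of a CU as its feature and stop splitting when the energy falls below a threshold. This is a simplification of the paper's self-learning model; the threshold and block contents are illustrative.

```python
import numpy as np

def residual_feature(orig, pred):
    """Residual = original minus prediction; use its mean energy as the feature."""
    res = orig.astype(np.int32) - pred.astype(np.int32)
    return float(np.mean(res * res))

def early_terminate(orig, pred, threshold):
    """Binary classifier sketch: low residual energy -> stop splitting the CU."""
    return residual_feature(orig, pred) < threshold

orig = np.full((16, 16), 100, dtype=np.uint8)
good_pred = np.full((16, 16), 99, dtype=np.uint8)  # near-perfect prediction
bad_pred = np.full((16, 16), 60, dtype=np.uint8)   # poor prediction
stop = early_terminate(orig, good_pred, threshold=4.0)       # no further split
split = not early_terminate(orig, bad_pred, threshold=4.0)   # keep splitting
```

In the paper the decision boundary is learned online from previously encoded CTUs rather than fixed by hand.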
10.
An original wireless video transmission scheme called SoftCast has recently been proposed to deal with the issues encountered in conventional wireless video broadcasting systems (e.g. the cliff effect). In this paper, we evaluate and optimize the performance of the SoftCast scheme according to the transmitted video content. Specifically, we propose an adaptive coding mechanism based on GoP-size adaptation that takes into account the temporal information fluctuations of the video. This extension, denoted Adaptive GoP-size mechanism based on Content and Cut detection for SoftCast (AGCC-SoftCast), significantly improves the performance of the SoftCast scheme. It modifies the GoP-size according to shot changes and the spatio-temporal characteristics of the transmitted video. When hardware capacities such as buffer size or processor performance are limited, an alternative method based only on shot-change detection (AGCut-SoftCast) is also proposed. Improvements of up to 16 dB in PSNR and up to 0.55 in SSIM are observed with the proposed solutions at cut boundaries. In addition, temporal visual-quality fluctuations are reduced to under 1 dB on average, showing the effectiveness of the proposed methods.
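The cut-detection-driven GoP-size adaptation can be sketched as follows: flag a shot change when the mean absolute difference between consecutive frames exceeds a threshold, then start a new GoP at every cut, capped at a maximum GoP length. The threshold and frame data below are illustrative, not the paper's detector.

```python
import numpy as np

def cut_positions(frames, threshold=30.0):
    """Flag a shot change when the mean absolute frame difference is large."""
    cuts = []
    for i in range(1, len(frames)):
        mad = np.mean(np.abs(frames[i].astype(np.int32)
                             - frames[i - 1].astype(np.int32)))
        if mad > threshold:
            cuts.append(i)
    return cuts

def gop_boundaries(num_frames, cuts, max_gop=32):
    """Start a new GoP at every detected cut, capping GoP length at max_gop."""
    starts = [0]
    for i in range(1, num_frames):
        if i in cuts or i - starts[-1] >= max_gop:
            starts.append(i)
    return starts

# Ten synthetic frames with one abrupt shot change at frame 5
frames = [np.full((8, 8), 10)] * 5 + [np.full((8, 8), 200)] * 5
cuts = cut_positions(frames)
gops = gop_boundaries(len(frames), cuts)
```

Aligning GoP starts with cuts keeps each GoP's statistics homogeneous, which is what lets the decorrelating transform in SoftCast-style schemes work well.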